Entropy-Based Concentration Inequalities for Dependent Variables

Authors

  • Liva Ralaivola
  • Massih-Reza Amini
Abstract

We provide new concentration inequalities for functions of dependent variables. The work extends that of Janson (2004), which proposes concentration inequalities based on a combination of the Laplace transform and the idea of fractional graph coloring, as well as many works that derive concentration inequalities using the entropy method (see, e.g., Boucheron et al., 2003). We give inequalities for fractionally sub-additive and fractionally self-bounding functions. Along the way, we prove a new Talagrand concentration inequality for fractionally sub-additive functions of dependent variables. These results open the way to deriving generalization bounds for various applications where dependent variables naturally appear, such as bipartite ranking.
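
For reference, the Hoeffding-type inequality of Janson (2004) that this work extends can be sketched as follows; here Γ denotes the dependency graph of the variables, χ*(Γ) its fractional chromatic number, and each summand Y_i is assumed to lie in [a_i, b_i] (notation introduced here only for illustration):

    \Pr\Bigl[\textstyle\sum_i Y_i \ \ge\ \mathbb{E}\bigl[\sum_i Y_i\bigr] + t\Bigr] \ \le\ \exp\!\left(\frac{-2\,t^2}{\chi^*(\Gamma)\,\sum_i (b_i - a_i)^2}\right), \qquad t > 0.

When the variables are independent, χ*(Γ) = 1 and the bound reduces to the usual Hoeffding inequality, so χ*(Γ) can be read as the price paid for dependence; the paper combines this fractional-coloring idea with the entropy method to obtain inequalities for fractionally sub-additive and fractionally self-bounding functions.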


Similar Resources

Concentration of Measure Inequalities in Information Theory, Communications and Coding

Concentration inequalities have been the subject of exciting developments during the last two decades, and they have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices and percolat...


Concentration of Measure Inequalities and Their Communication and Information-Theoretic Applications

During the last two decades, concentration of measure has been a subject of various exciting developments in convex geometry, functional analysis, statistical physics, high-dimensional statistics, probability theory, information theory, communications and coding theory, computer science, and learning theory. One common theme which emerges in these fields is probabilistic stability: complicated,...


Concentration of Measure Inequalities in Information Theory, Communications and Coding (Second Edition)

Concentration inequalities have been the subject of exciting developments during the last two decades, and have been intensively studied and used as a powerful tool in various areas. These include convex geometry, functional analysis, statistical physics, mathematical statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and...


Concentration of Measure Inequalities in Information Theory, Communications, and Coding

During the last two decades, concentration inequalities have been the subject of exciting developments in various areas, including convex geometry, functional analysis, statistical physics, high-dimensional statistics, pure and applied probability theory (e.g., concentration of measure phenomena in random graphs, random matrices, and percolation), information theory, theoretical computer scienc...


Some Probability Inequalities for Quadratic Forms of Negatively Dependent Subgaussian Random Variables

In this paper, we obtain upper exponential bounds for the tail probabilities of quadratic forms of negatively dependent subgaussian random variables. In particular, the law of the iterated logarithm for quadratic forms of independent subgaussian random variables is generalized to the case of negatively dependent subgaussian random variables.
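
For context, in the independent case such tail bounds are exemplified by the classical Hanson-Wright inequality, stated here only as a reference point (x has independent, mean-zero coordinates with subgaussian norm at most K, A is a fixed matrix with Frobenius norm \lVert A\rVert_F and operator norm \lVert A\rVert, and c > 0 is an absolute constant):

    \Pr\Bigl[\,\bigl| x^{\top} A x - \mathbb{E}[x^{\top} A x] \bigr| > t \,\Bigr] \ \le\ 2\exp\!\left( -c\,\min\!\left( \frac{t^2}{K^4 \lVert A\rVert_F^2},\ \frac{t}{K^2 \lVert A\rVert} \right) \right), \qquad t \ge 0.

The paper above pursues exponential bounds of this general flavor when the coordinates are negatively dependent rather than independent.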


Publication date: 2015